Symmetry, order, entropy and information

Author

  • György Darvas
Abstract

The conditions of applicability of the laws established for thermodynamic entropy do not necessarily hold for the entropy defined for information. One must therefore handle carefully any conclusions about information derived by mathematical analogy from the laws that hold for thermodynamic entropy. Entropy, and the arrow of its change, are closely related to the arrows of change of symmetry and of orderliness. Symmetry and order are interpreted in different ways in statistical thermodynamics, in symmetrology, and in evolution, and their relation to each other is likewise equivocal. Evolution means something quite different in statistical physics than it does in philosophical terms. Which of these different interpretations can be transferred to the description of information? Entropy, introduced by Shannon through a mathematical analogy borrowed from thermodynamics, is a means of characterising information. One is looking for the most general possible information theory; the generality of the sought theory can be judged by its applicability to all (or at least most) kinds of information. However, I express doubts as to whether entropy is a property that can characterise all kinds of information. Entropy plays an important role in information theory. The concept was borrowed from physics, more precisely from thermodynamics, and applied to information through certain formal analogies. Several authors who have contributed to the FIS discussion and published papers in the journal Entropy have emphasized the differences that exist alongside these analogies. Since the relation of entropy, as applied in information theory, to symmetry is inherited from its physical origin, it is worth taking a glance at the ambiguous meaning of this term in physics in its relation to order and to symmetry, respectively.
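As a minimal, illustrative sketch (not part of Darvas's paper) of the formal analogy the abstract refers to: Shannon's entropy and the Gibbs form of thermodynamic entropy share the same functional shape, -Σᵢ pᵢ log pᵢ, differing only in a constant factor and the base of the logarithm. The function names below are mine.

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H = -sum(p * log_base(p)); in bits for base 2."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def gibbs_entropy(probs, k_B=1.380649e-23):
    """Statistical-thermodynamic (Gibbs) entropy S = -k_B * sum(p * ln p), in J/K."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# The same uniform distribution over 4 (micro)states maximizes both quantities:
uniform = [0.25] * 4
print(shannon_entropy(uniform))  # 2.0 bits
print(gibbs_entropy(uniform))    # k_B * ln 4 ≈ 1.91e-23 J/K
```

The identical functional form is what licenses the mathematical analogy; the abstract's caveat is that this formal identity does not by itself guarantee that the laws governing thermodynamic entropy carry over to informational entropy.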


Related articles

Entropy, Macroscopic Information, and Phase Transitions (arXiv, 4 Nov 1999)

The relationship between entropy and information is reviewed, taking into account that information is stored in macroscopic degrees of freedom, such as the order parameter in a system exhibiting spontaneous symmetry breaking. It is shown that most problems of the relationship between entropy and information, embodied in a variety of Maxwell demons, are also present in any symmetry breaking tran...

Full text

The Nature of Molecular Recognition, Self-assembly and Self-organization: Revised Information Theory and the Relation of Entropy, Symmetry and Diversity

Molecular interactions seek the most symmetric static structures. However, symmetry has mainly been regarded as a mathematical attribute [1-2]. The Curie-Rosen symmetry principle [2] is a higher symmetry−higher stability relation that has seldom, if ever, been accepted for consideration of structural stability and process spontaneity (or process irreversibility). Most people accept the higher symm...

Full text

Nonextensive Entropy, Prior PDFs and Spontaneous Symmetry Breaking

We show that using nonextensive entropy can lead to spontaneous symmetry breaking when a parameter changes its value from that applicable to a symmetric domain, as in field theory. We give the physical reasons, and also show that even for symmetric Dirichlet priors, such a definition of the entropy and of the parameter value can lead to asymmetry when the entropy is maximized.
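As a hedged illustration (function and parameter names are mine, not the authors'): the nonextensive entropy in question is usually the Tsallis form S_q = (1 − Σᵢ pᵢ^q)/(q − 1), which recovers the Boltzmann-Gibbs/Shannon form in the limit q → 1, so the entropic index q is exactly the kind of parameter whose change of value can produce the asymmetry described.

```python
import math

def tsallis_entropy(probs, q):
    """Nonextensive Tsallis entropy S_q = (1 - sum(p**q)) / (q - 1).
    Recovers the Boltzmann-Gibbs/Shannon form -sum(p * ln p) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p**q for p in probs)) / (q - 1.0)

probs = [0.5, 0.3, 0.2]
for q in (0.5, 0.999, 1.0, 2.0):
    print(q, tsallis_entropy(probs, q))
# As q approaches 1, the value converges to the Shannon entropy in nats.
```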

Full text

New Upper Bound and Lower Bound for Degree-Based Network Entropy

The degree-based network entropy, which is inspired by Shannon's entropy concept, has become an information-theoretic quantity for measuring the structural information of graphs and complex networks. In this paper, we study some properties of the degree-based network entropy. Firstly we develop a refinement of Jensen's inequality. Next we present the new and more accurate upper bound and lower boun...
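A sketch, assuming one common convention for degree-based network entropy (the paper may normalize differently): the Shannon entropy of the degree weights pᵢ = dᵢ / Σⱼ dⱼ over vertex degrees dᵢ. The function name and example graphs are mine.

```python
import math

def degree_entropy(degrees):
    """Degree-based network entropy: Shannon entropy of the
    degree-weight distribution p_i = d_i / sum(d_j), in bits."""
    total = sum(degrees)
    return -sum((d / total) * math.log2(d / total) for d in degrees if d > 0)

# Star graph K_{1,3}: degrees (3, 1, 1, 1); path P_4: degrees (1, 2, 2, 1).
print(degree_entropy([3, 1, 1, 1]))  # ≈ 1.79 bits
print(degree_entropy([1, 2, 2, 1]))  # ≈ 1.92 bits (more uniform degrees, higher entropy)
```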

Full text

International Journal of Molecular Sciences 2001, 2, 1-9

Symmetry is a measure of indistinguishability. Similarity is a continuous measure of imperfect symmetry. Lewis' remark that “gain of entropy means loss of information” defines the relationship of entropy and information. Three laws of information theory have been proposed. Labeling by introducing nonsymmetry and formatting by introducing symmetry are defined. The function L (L = ln w, w is the nu...
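Assuming the truncated formula is the logarithmic measure L = ln w, with w counting the number of indistinguishable arrangements (the Boltzmann form without the constant k_B), a minimal worked instance:

```python
import math

# L = ln(w): w counts indistinguishable arrangements; L grows with indistinguishability.
for w in (1, 2, 4, 8):
    print(w, round(math.log(w), 3))  # 0.0, 0.693, 1.386, 2.079
```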

Full text


Journal title:

Volume   Issue

Pages  -

Publication date: 2005